Weights Direct Determination of Feedforward Neural Networks without Iterative BP-Training
Authors
Abstract
Benefiting from their parallel-processing nature, distributed storage, and self-adaptive and self-learning abilities, artificial neural networks (ANNs) have been investigated and applied widely in many scientific, engineering, and practical fields, such as classification and diagnosis (Hong & Tseng, 1991; Jia & Chong, 1995; Sadeghi, 2000; Wang & Li, 1991), image and signal processing (Steriti & Fiddy, 1993), control system design (Zhang & Wang, 2001, 2002), and equation solving (Zhang, Jiang & Wang, 2002; Zhang & Ge, 2005; Zhang ...
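The excerpt above is truncated before the method itself, but the title names the key idea: determining a feedforward network's weights directly, without iterative BP training. A minimal sketch of that general idea, under the assumption that the hidden-layer parameters are fixed and only the output weights are solved for, is a one-step linear least-squares solution via the pseudoinverse (the toy task and all hyperparameters below are illustrative, not the paper's):

```python
import numpy as np

# Sketch (an assumption, not the paper's exact formulation): with a fixed
# hidden layer, the output weights of a single-hidden-layer network can be
# determined in one step as W_out = pinv(H) @ y, with no iterative BP.

rng = np.random.default_rng(0)

# Toy regression task: y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

# Fixed (non-trained) hidden layer: random input weights and biases
n_hidden = 50
W_in = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W_in + b)          # hidden-layer output matrix

# Direct determination of the output weights via the pseudoinverse
W_out = np.linalg.pinv(H) @ y

mse = np.mean((H @ W_out - y) ** 2)
print(f"training MSE: {mse:.2e}")
```

Because the output weights solve a linear least-squares problem exactly, the fit is obtained in a single matrix computation rather than many gradient-descent epochs.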
Similar resources
Bidirectional Backpropagation: Towards Biologically Plausible Error Signal Transmission in Neural Networks
The back-propagation (BP) algorithm has been considered the de-facto method for training deep neural networks. It back-propagates errors from the output layer to the hidden layers in an exact manner using the transpose of the feedforward weights. However, it has been argued that this is not biologically plausible because back-propagating error signals with the exact incoming weights is not cons...
Convergence of Batch BP Algorithm with Penalty for FNN Training
Penalty methods have been commonly used to improve the generalization performance of feedforward neural networks and to control the magnitude of the network weights. Weight boundedness and convergence results are presented for the batch BP algorithm with penalty for training feedforward neural networks with a hidden layer. A key point of the proofs is the monotonicity of the error function with...
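The batch BP-with-penalty scheme described above can be sketched as gradient descent on a penalized error function E(w) = MSE + λ‖w‖²; the network shape, learning rate, and penalty coefficient below are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Sketch: batch gradient descent for a one-hidden-layer network with an
# L2 penalty term, E(w) = MSE + lam * ||w||^2 -- the kind of penalized
# error function whose decrease the convergence results concern.

rng = np.random.default_rng(1)
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = X ** 2

n_hidden = 10
W1 = rng.normal(scale=0.5, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
lam, lr = 1e-4, 0.1

def penalized_error():
    H = np.tanh(X @ W1 + b1)
    err = H @ W2 - y
    return np.mean(err**2) + lam * (np.sum(W1**2) + np.sum(W2**2))

errors = []
for _ in range(500):
    H = np.tanh(X @ W1 + b1)                    # forward pass
    err = H @ W2 - y                            # residual
    # Backprop gradients of the penalized batch error
    gW2 = 2 * H.T @ err / len(X) + 2 * lam * W2
    dH = (err @ W2.T) * (1 - H**2)              # through tanh
    gW1 = 2 * X.T @ dH / len(X) + 2 * lam * W1
    gb1 = 2 * np.mean(dH, axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2
    errors.append(penalized_error())

print(f"E start {errors[0]:.4f} -> end {errors[-1]:.4f}")
```

The penalty term both bounds the weight magnitudes and enters the gradients as a weight-decay contribution, which is what keeps the iterates from drifting to unbounded weights.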
An evolutionary approach to training feedforward and recurrent neural networks
This paper describes a method of utilising genetic algorithms to train fixed architecture feed-forward and recurrent neural networks. The technique described uses the genetic algorithm to evolve changes to the weights and biases of the network rather than the weights and biases themselves. Results achieved by this technique indicate that for many problems it compares very favourably with the mo...
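The distinctive point above is that the genetic algorithm evolves *changes* to the weights rather than the weights themselves. A toy sketch of that idea (the XOR task, population sizes, and mutation scale are assumptions, not the paper's setup):

```python
import numpy as np

# Sketch: evolve a population of weight *deltas* for a fixed-architecture
# 2-3-1 network on XOR; each generation, the fittest delta (if it improves
# the loss) is applied to the current weights, and the delta population is
# recombined and mutated.

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

def unpack(v):
    W1 = v[:6].reshape(2, 3); b1 = v[6:9]
    W2 = v[9:12].reshape(3, 1); b2 = v[12:]
    return W1, b1, W2, b2

def loss(v):
    W1, b1, W2, b2 = unpack(v)
    H = np.tanh(X @ W1 + b1)
    return float(np.mean((np.tanh(H @ W2 + b2) - y) ** 2))

w = rng.normal(scale=0.5, size=13)             # current network weights
pop = rng.normal(scale=0.3, size=(30, 13))     # population of deltas

for gen in range(200):
    fitness = np.array([loss(w + d) for d in pop])
    order = np.argsort(fitness)
    if fitness[order[0]] < loss(w):
        w = w + pop[order[0]]                  # apply the fittest delta
    best = pop[order[:10]]                     # surviving deltas
    # Breed a new delta population: recombine pairs of survivors + mutate
    parents = best[rng.integers(0, 10, size=(30, 2))]
    pop = parents.mean(axis=1) + rng.normal(scale=0.1, size=(30, 13))

print(f"final loss: {loss(w):.4f}")
```

Accepting a delta only when it lowers the loss makes the search elitist, so the training error is non-increasing across generations even though no gradients are computed.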
Convergence of BP algorithm for product unit neural networks with exponential weights
Product unit neural networks with exponential weights (PUNNs) can provide more powerful internal representation capability than traditional feed-forward neural networks. In this paper, a convergence result of the back-propagation (BP) algorithm for training PUNNs is presented. The monotonicity of the error function in the training iteration process is also guaranteed. A numerical example is giv...
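The "exponential weights" in a product unit can be sketched as follows, assuming the common formulation in which a unit computes a product of its inputs raised to learnable exponents rather than a weighted sum:

```python
import numpy as np

# Sketch (an assumption about the exact formulation): a product unit
# computes h_j = prod_i x_i ** w_ij, so a single unit can represent a
# higher-order term such as x1**2 * x2 -- the richer internal
# representation mentioned above.

def product_units(X, W):
    # X: (n_samples, n_in) with positive inputs; W: (n_in, n_units)
    # exp/log trick: prod_i x_i**w_ij == exp(sum_i w_ij * log(x_i))
    return np.exp(np.log(X) @ W)

X = np.array([[2.0, 3.0]])
W = np.array([[2.0], [1.0]])        # exponents 2 and 1
print(product_units(X, W))          # one unit computing x1**2 * x2 = 12
```

The exp/log form also shows why BP still applies: the unit is differentiable in the exponents w_ij, so the same chain-rule machinery yields gradients for training.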
Memory Capacity of a Novel Optical Neural Net Architecture
A new associative memory neural network which can be constructed using optical matched filters is described. It has three layers, the centre one being iterative with its weights set prior to training. The other two layers are feedforward nets and the weights are set during training. The best choice of central layer weights, or in optical terms, of pairs of images associated in a hologram is con...
Journal:
Volume, Issue:
Pages: -
Publication date: 2015